
    New Equations for Neutral Terms: A Sound and Complete Decision Procedure, Formalized

    The definitional equality of an intensional type theory is its test of type compatibility. Today's systems rely on ordinary evaluation semantics to compare expressions in types, frustrating users with type errors arising when evaluation fails to identify two `obviously' equal terms. If only the machine could decide a richer theory! We propose a way to decide theories which supplement evaluation with `ν-rules', rearranging the neutral parts of normal forms, and report a successful initial experiment. We study a simple λ-calculus with primitive fold, map and append operations on lists and develop in Agda a sound and complete decision procedure for an equational theory enriched with monoid, functor and fusion laws.
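
    A rough Haskell sketch of the idea (the paper's development is in Agda; the expression language, the purely syntactic treatment of function composition, and all names below are illustrative assumptions rather than the paper's code): neutral list expressions are normalised modulo the monoid laws for append and the functor/fusion laws for map, and the decision procedure compares normal forms.

```haskell
-- Neutral list expressions: variables, nil, append, and map by a named
-- (opaque) function.
data LExp
  = V String            -- a free list variable
  | Nil
  | App LExp LExp       -- list append
  | Map String LExp     -- map by a named function
  deriving (Eq, Show)

-- Normalise modulo the laws: append is right-nested with Nil removed,
-- map distributes over append and kills Nil, and nested maps are fused
-- (composition is recorded syntactically by name, a crude stand-in for
-- the paper's semantic treatment).
norm :: LExp -> LExp
norm (App x y) = case norm x of
  Nil     -> norm y
  App a b -> App a (norm (App b y))        -- reassociate to the right
  nx      -> case norm y of
               Nil -> nx
               ny  -> App nx ny
norm (Map f e) = case norm e of
  Nil     -> Nil
  App a b -> App (norm (Map f a)) (norm (Map f b))
  Map g x -> Map (f ++ " . " ++ g) x       -- fusion law
  nx      -> Map f nx
norm e = e

-- The decision procedure: equal in the enriched theory exactly when the
-- normal forms coincide.
decide :: LExp -> LExp -> Bool
decide s t = norm s == norm t

-- Example: map f (xs ++ []) and [] ++ map f xs are identified.
example :: Bool
example = decide (Map "f" (App (V "xs") Nil)) (App Nil (Map "f" (V "xs")))
```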

    Transporting Functions across Ornaments

    Programming with dependent types is a blessing and a curse. It is a blessing to be able to bake invariants into the definition of data-types: we can finally write correct-by-construction software. However, this extreme accuracy is also a curse: a data-type is the combination of a structuring medium together with a special purpose logic. These domain-specific logics hamper any effort of code reuse among similarly structured data. In this paper, we exorcise our data-types by adapting the notion of ornament to our universe of inductive families. We then show how code reuse can be achieved by ornamenting functions. Using these functional ornaments, we capture the relationship between functions such as the addition of natural numbers and the concatenation of lists. With this knowledge, we demonstrate how the implementation of the former informs the implementation of the latter: the user can ask the definition of addition to be lifted to lists and she will only be asked the details necessary to carry on adding lists rather than numbers. Our presentation is formalised in a type theory with a universe of data-types and all our constructions have been implemented as generic programs, requiring no extension to the type theory.
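
    A plain Haskell illustration of the paper's running example (the real construction is generic, in a dependently typed universe of datatypes; this sketch only exhibits the relationship being captured): lists ornament the natural numbers, and append is the lifting of addition along that ornament.

```haskell
-- Natural numbers, and the addition we want to transport.
data Nat = Z | S Nat deriving (Eq, Show)

add :: Nat -> Nat -> Nat
add Z     n = n
add (S m) n = S (add m n)

-- Forgetting the ornament: a list is a natural number decorated with an
-- element at every successor.
len :: [a] -> Nat
len []       = Z
len (_ : xs) = S (len xs)

-- The lifted function follows the recursion pattern of add; the only
-- extra information the programmer must supply is what to do with the
-- decorations (here: keep the head element).
app :: [a] -> [a] -> [a]
app []       ys = ys
app (x : xs) ys = x : app xs ys

-- Coherence, which the functional-ornament machinery guarantees by
-- construction: len (app xs ys) == add (len xs) (len ys).
```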

    Everybody's got to be somewhere

    The key to any nameless representation of syntax is how it indicates the variables we choose to use and thus, implicitly, those we discard. Standard de Bruijn representations delay discarding maximally, till the leaves of terms, where one variable is chosen from those in scope at the expense of the rest. Consequently, introducing new but unused variables requires term traversal. This paper introduces a nameless 'co-de-Bruijn' representation which makes the opposite canonical choice, delaying discarding minimally, as near as possible to the root. It is literate Agda: dependent types make it a practical joy to express and be driven by strong intrinsic invariants which ensure that scope is aggressively whittled down to just the support of each subterm, in which every remaining variable occurs somewhere. The construction is generic, delivering a universe of syntaxes with higher-order metavariables, for which the appropriate notion of substitution is hereditary. The implementation of simultaneous substitution exploits tight scope control to avoid busywork and shift terms without traversal. Surprisingly, it is also intrinsically terminating, by structural recursion alone.
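
    A rough plain-Haskell caricature of the co-de-Bruijn discipline (the paper enforces these invariants intrinsically in Agda and works generically over a universe of syntaxes; the types and names below are illustrative, and the invariant is merely checked rather than enforced): every subterm is paired with a thinning selecting exactly its support from the enclosing scope, and at a binary node the two thinnings must jointly cover that scope.

```haskell
-- A thinning: one flag per variable of the enclosing scope, recording
-- whether that variable is passed down to the subterm.
type Thinning = [Bool]

-- Untyped lambda terms, co-de-Bruijn style: discarding happens as near
-- the root as possible, so Var needs no index (its scope is exactly the
-- one variable it uses) and Lam records whether its bound variable is
-- used at all.
data Tm
  = Var
  | App (Thinning, Tm) (Thinning, Tm)
  | Lam Bool Tm
  deriving Show

-- At a node with scope size n, the two thinnings must jointly cover the
-- scope, so every variable in scope occurs somewhere below.
covers :: Int -> Thinning -> Thinning -> Bool
covers n l r = length l == n && length r == n && and (zipWith (||) l r)

-- The invariant, as a check over a scope of size n.
wellScoped :: Int -> Tm -> Bool
wellScoped n Var = n == 1
wellScoped n (App (l, f) (r, a)) =
  covers n l r
    && wellScoped (length (filter id l)) f
    && wellScoped (length (filter id r)) a
wellScoped n (Lam used b) = wellScoped (if used then n + 1 else n) b
```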

    Dependently Typed Functional Programs and their Proofs

    Research in dependent type theories [ML71a] has, in the past, concentrated on its use in the presentation of theorems and theorem-proving. This thesis is concerned mainly with the exploitation of the computational aspects of type theory for programming, in a context where the properties of programs may readily be specified and established. In particular, it develops technology for programming with dependent inductive families of datatypes and proving those programs correct. It demonstrates the considerable advantage to be gained by indexing data structures with pertinent characteristic information whose soundness is ensured by typechecking, rather than human effort.
    Type theory traditionally presents safe and terminating computation on inductive datatypes by means of elimination rules which serve as induction principles and, via their associated reduction behaviour, recursion operators [Dyb91]. In the programming language arena, these appear somewhat cumbersome and give rise to unappealing code, complicated by the inevitable interaction between case analysis on dependent types and equational reasoning on their indices, which must appear explicitly in the terms. Thierry Coquand's proposal [Coq92] to equip type theory directly with the kind of pattern matching notation to which functional programmers have become used over the past three decades [Bur69, McB70] offers a remedy to many of these difficulties. However, the status of pattern matching relative to the traditional elimination rules has until now been in doubt. Pattern matching implies the uniqueness of identity proofs, which Martin Hofmann showed underivable from the conventional definition of equality [Hof95]. This thesis shows that the adoption of this uniqueness as axiomatic is sufficient to make pattern matching admissible.
    A datatype's elimination rule allows abstraction only over the whole inductively defined family. In order to support pattern matching, the application of such rules to specific instances of dependent families has been systematised. The underlying analysis extends beyond datatypes to other rules of a similar second-order character, suggesting they may have other roles to play in the specification, verification and, perhaps, derivation of programs. The technique developed shifts the specificity from the instantiation of the type's indices into equational constraints on indices freely chosen, allowing the elimination rule to be applied. Elimination by this means leaves equational hypotheses in the resulting subgoals, which must be solved if further progress is to be made. The first-order unification algorithm for constructor forms in simple types presented in [McB96] has been extended to cover dependent datatypes as well, yielding a completely automated solution to a class of problems which can be syntactically defined.
    The justification and operation of these techniques requires the machine to construct and exploit a standardised collection of auxiliary lemmas for each datatype. This is greatly facilitated by two technical developments of interest in their own right: a more convenient definition of equality, with a relaxed formation rule allowing elements of different types to be compared, but nonetheless equivalent to the usual equality plus the axiom of uniqueness; and a type theory, OLEG, which incorporates incomplete objects, accounting for their `holes' entirely within the typing judgments and, novelly, not requiring any notion of explicit substitution to manage their scopes.
    A substantial prototype has been implemented, extending the proof assistant LEGO [LP92]. A number of programs are developed by way of example. Chiefly, the increased expressivity of dependent datatypes is shown to capture a standard first-order unification algorithm within the class of structurally recursive programs, removing any need for a termination argument. Furthermore, the use of elimination rules in specifying the components of the program significantly simplifies its correctness proof.
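
    The 'relaxed' equality mentioned above is heterogeneous: its formation rule compares elements of possibly different types, while its single constructor still demands that they coincide. Purely as an illustration of the shape of that definition (the thesis works in type theory, not Haskell), here is a type-level GADT transcription; GHC ships essentially this type as (:~~:) in Data.Type.Equality.

```haskell
{-# LANGUAGE GADTs #-}

-- The formation rule is relaxed: HEq may relate two different types.
-- The only constructor insists that they are in fact the same.
data HEq a b where
  Refl :: HEq a a

-- Eliminating the equation: once Refl is matched, the types are known
-- to be equal, so transport is the identity.
hsubst :: HEq a b -> f a -> f b
hsubst Refl x = x
```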

    Doo bee doo bee doo

    We explore the design and implementation of Frank, a strict functional programming language with a bidirectional effect type system designed from the ground up around a novel variant of Plotkin and Pretnar's effect handler abstraction. Effect handlers provide an abstraction for modular effectful programming: a handler acts as an interpreter for a collection of commands whose interfaces are statically tracked by the type system. However, Frank eliminates the need for an additional effect handling construct by generalising the basic mechanism of functional abstraction itself. A function is but the special case of a Frank operator that interprets no commands. Moreover, Frank's operators can be multihandlers which interpret commands from several sources at once, without disturbing the direct style of functional programming with values. Effect typing in Frank employs a novel form of effect polymorphism which avoids mentioning effect variables in source code. This is achieved by propagating an ambient ability inwards, rather than accumulating unions of potential effects outwards. With the ambient ability describing the effects available at a given point in the code, it can become necessary to reconfigure access to that ability. A primary goal is to be able to encapsulate internal effects, eliminating a phenomenon we call effect pollution. Moreover, it is sometimes desirable to rewire the effect flow between effectful library components. We propose adaptors as a means of supporting both effect encapsulation and more general rewiring. Programming with effects and handlers is in its infancy. We contribute an exploration of future possibilities, particularly in combination with other forms of rich type systems.
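
    A minimal free-monad-style sketch of the handler idea in Haskell (Frank itself builds handling into operator application and tracks abilities in its type system; the State interface and all names below are only an analogy, not Frank code): a computation is a tree of commands, and a handler is an interpreter for those commands.

```haskell
-- A computation over a State interface: either a result, or a command
-- (Get / Put) together with its continuation.
data St s a
  = Ret a
  | Get (s -> St s a)
  | Put s (St s a)

-- A handler interprets the commands; this one threads a state value
-- through the computation.
runSt :: s -> St s a -> (a, s)
runSt s (Ret a)    = (a, s)
runSt s (Get k)    = runSt s (k s)
runSt _ (Put s' k) = runSt s' k

-- Example: read the counter, bump it, and return the old value.
tick :: St Int Int
tick = Get (\n -> Put (n + 1) (Ret n))

-- runSt 41 tick == (41, 42)
```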

    Variations on Inductive-Recursive Definitions

    Dybjer and Setzer introduced the definitional principle of inductive-recursively defined families, i.e. families (U : Set, T : U -> D) such that the inductive definition of U may depend on the recursively defined T, by defining a type DS D E of codes. Each c : DS D E defines a functor [c] : Fam D -> Fam E, and (U, T) = mu [c] : Fam D is exhibited as the initial algebra of [c]. This paper considers the composition of DS-definable functors: given F : Fam C -> Fam D and G : Fam D -> Fam E, is G circ F : Fam C -> Fam E DS-definable, if F and G are? We show that this is the case if and only if powers of families are DS-definable, which seems unlikely. To construct composition, we present two new systems UF and PN of codes for inductive-recursive definitions, with UF a subsystem of DS, and DS a subsystem of PN. Both UF and PN are closed under composition. Since PN defines a potentially larger class of functors, we show that there is a model where initial algebras of PN-functors exist, by adapting Dybjer-Setzer's proof for DS.
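
    For orientation, the Dybjer-Setzer codes and their decoding, as standardly presented in the literature (recalled here from that general presentation rather than quoted from this paper, so treat the exact formulation as an assumption): a code is built from iota (stop), sigma (non-inductive argument) and delta (inductive argument), and [c] acts on a family (U, T).

```latex
% Fam D = \Sigma (U : \mathrm{Set}).\ U \to D, and c : \mathrm{DS}\,D\,E is built from
%   \iota  : E \to \mathrm{DS}\,D\,E
%   \sigma : (A : \mathrm{Set}) \to (A \to \mathrm{DS}\,D\,E) \to \mathrm{DS}\,D\,E
%   \delta : (A : \mathrm{Set}) \to ((A \to D) \to \mathrm{DS}\,D\,E) \to \mathrm{DS}\,D\,E
% with [c] : Fam D -> Fam E given by
\begin{align*}
  [\iota\,e]\,(U,T)     &= (1,\ \lambda \_.\ e) \\
  [\sigma\,A\,f]\,(U,T) &= \textstyle\sum_{a : A} [f\,a]\,(U,T) \\
  [\delta\,A\,F]\,(U,T) &= \textstyle\sum_{g : A \to U} [F\,(T \circ g)]\,(U,T)
\end{align*}
```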

    A type- and scope-safe universe of syntaxes with binding: their semantics and proofs

    Almost every programming language's syntax includes a notion of binder and corresponding bound occurrences, along with the accompanying notions of alpha-equivalence, capture-avoiding substitution, typing contexts, runtime environments, and so on. In the past, implementing and reasoning about programming languages required careful handling to maintain the correct behaviour of bound variables. Modern programming languages include features that enable constraints like scope safety to be expressed in types. Nevertheless, the programmer is still forced to write the same boilerplate over and over again for each new implementation of a scope-safe operation (e.g., renaming, substitution, desugaring, printing, etc.), and then again for the corresponding correctness proofs. We present an expressive universe of syntaxes with binding and demonstrate how to (1) implement scope-safe traversals once and for all by generic programming; and (2) derive properties of these traversals by generic proving. Our universe description, generic traversals and proofs, and our examples have all been formalised in Agda and are available online at https://github.com/gallais/generic-syntax.
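
    A fixed-syntax Haskell taste of what scope safety buys (the paper's contribution is to be generic over a whole universe of syntaxes, in Agda; this sketch hard-codes the untyped lambda calculus and is only an analogy): the type of free variables plays the role of the scope, and the traversal below is the kind of boilerplate the generic library writes once and for all.

```haskell
{-# LANGUAGE DeriveFunctor #-}

-- Well-scoped terms: the parameter is the type of free variables, and
-- going under a binder extends it with one extra inhabitant via Maybe.
data Tm v
  = Var v
  | App (Tm v) (Tm v)
  | Lam (Tm (Maybe v))
  deriving (Functor, Show)

-- Renaming is fmap; simultaneous substitution is the scope-safe
-- traversal, weakening the substitution when it goes under a binder.
subst :: (v -> Tm w) -> Tm v -> Tm w
subst s (Var x)   = s x
subst s (App f a) = App (subst s f) (subst s a)
subst s (Lam b)   = Lam (subst (maybe (Var Nothing) (fmap Just . s)) b)
```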

    Type systems for programs respecting dimensions

    Type systems can be used for tracking the dimensional consistency of numerical computations: we present an extension from dimensions of scalar quantities to dimensions of vectors and matrices, making use of dependent types from programming language theory. We show that our types are unique and most general. We further show that we can give straightforward dimensioned types to many common matrix operations such as addition, multiplication, determinants, traces, and fundamental row operations.
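
    A very small phantom-type sketch of the scalar starting point (the paper's contribution is the extension to vectors and matrices, which needs dependent types; the single base dimension with natural-number exponents below is a deliberate simplification, and all names are illustrative):

```haskell
{-# LANGUAGE DataKinds, KindSignatures, TypeOperators #-}

import GHC.TypeLits

-- A quantity carrying, in its type, the exponent of one base dimension
-- (say length): Q 1 is metres, Q 2 square metres.
newtype Q (n :: Nat) = Q Double
  deriving Show

-- Addition is only dimensionally consistent at a matching exponent.
addQ :: Q n -> Q n -> Q n
addQ (Q x) (Q y) = Q (x + y)

-- Multiplication adds exponents, via type-level arithmetic.
mulQ :: Q m -> Q n -> Q (m + n)
mulQ (Q x) (Q y) = Q (x * y)

-- An area from two lengths; adding 'area' to a quantity of type Q 1 is
-- rejected by the type checker.
area :: Q 2
area = mulQ (Q 3.0 :: Q 1) (Q 4.0 :: Q 1)
```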